Exclusive lasso-based k-nearest-neighbor classification
Authors
Abstract
Conventionally, k-nearest-neighbor (kNN) classification is implemented with Euclidean distance-based measures, which capture mainly one-to-one similarity relationships and thus lose connections between different samples. As a strategy to alleviate this issue, coefficients coded by sparse representation (SR) have also served as a similarity gauge. Although SR enjoys a remarkable discrimination nature through its one-to-many relationships, it carries out variable selection at the individual level, so that possible inherent group structure is ignored. To make the most of the information implied in the group structure, this paper employs the exclusive lasso (EL) to perform the similarity evaluation and proposes two novel methods. Experimental results on both benchmark data sets and a face recognition problem demonstrate that the EL-based kNN method outperforms certain state-of-the-art techniques and existing representation-based approaches in terms of the size of feature reduction and classification accuracy.
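The Euclidean-distance baseline that the abstract critiques can be sketched as follows. This is a generic, minimal kNN illustration (not the paper's EL-based method); the function and variable names are hypothetical:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k Euclidean nearest neighbors."""
    # One-to-one distances between the query and each training sample:
    # this is exactly the pairwise-only similarity the abstract points to.
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Tiny example with two well-separated classes
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(knn_predict(X, y, np.array([0.05, 0.1]), k=3))  # -> 0
```

Representation-based variants replace the distance step with coefficients obtained by coding the query over the whole training set, which is where sparse-representation and exclusive-lasso regularizers enter.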
Related articles
Problem Set 1 K-nearest Neighbor Classification
In this part, you will implement k-Nearest Neighbor (k-NN) algorithm on the 8scenes category dataset of Oliva and Torralba [1]. You are given a total of 800 labeled training images (containing 100 images for each class) and 1888 unlabeled testing images. Figure 1 shows some sample images from the data set. Your task is to analyze the performance of k-NN algorithm in classifying photographs into...
IKNN: Informative K-Nearest Neighbor Pattern Classification
The K-nearest neighbor (KNN) decision rule has been a ubiquitous classification tool with good scalability. Past experience has shown that the optimal choice of K depends upon the data, making it laborious to tune the parameter for different applications. We introduce a new metric that measures the informativeness of objects to be classified. When applied as a query-based distance metric to mea...
k-Nearest Neighbor Classification on Spatial Data
Classification of spatial data streams is crucial, since the training dataset changes often. Building a new classifier each time can be very costly with most techniques. In this situation, k-nearest neighbor (KNN) classification is a very good choice, since no residual classifier needs to be built ahead of time. KNN is extremely simple to implement and lends itself to a wide variety of variatio...
Privacy Preserving K-nearest Neighbor Classification
This paper considers how to conduct k-nearest neighbor classification in the following scenario: multiple parties, each having a private data set, want to collaboratively build a k-nearest neighbor classifier without disclosing their private data to each other or any other parties. Specifically, the data are vertically partitioned in that all parties have data about all the instances involved, ...
K-Nearest Neighbor Classification Using Anatomized Data
This paper analyzes k nearest neighbor classification with training data anonymized using anatomy. Anatomy preserves all data values, but introduces uncertainty in the mapping between identifying and sensitive values. We first study the theoretical effect of the anatomized training data on the k nearest neighbor error rate bounds, nearest neighbor convergence rate, and Bayesian error. We then v...
Journal
Journal title: Neural Computing and Applications
Year: 2021
ISSN: ['0941-0643', '1433-3058']
DOI: https://doi.org/10.1007/s00521-021-06069-5